Factorial Learning


Factorial Learning by Clustering Features

Tenenbaum, Joshua B., Todorov, Emanuel V.

Neural Information Processing Systems

We introduce a novel algorithm for factorial learning, motivated by segmentation problems in computational vision, in which the underlying factors correspond to clusters of highly correlated input features. The algorithm derives from a new kind of competitive clustering model, in which the cluster generators compete to explain each feature of the data set and cooperate to explain each input example, rather than competing for examples and cooperating on features, as in traditional clustering algorithms. A natural extension of the algorithm recovers hierarchical models of data generated from multiple unknown categories, each with a different, multiple causal structure. Several simulations demonstrate the power of this approach.
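As a rough illustration of the abstract's central idea, the sketch below soft-clusters the *columns* (features) of a data matrix rather than its rows, so that cluster prototypes compete per feature while jointly explaining each example. The farthest-point initialization, squared-error distance, and temperature `beta` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def cluster_features(X, K, n_iter=50, beta=1.0):
    """Soft clustering of the columns (features) of X into K groups.

    A minimal sketch of the 'clusters compete for features' idea:
    each cluster generator is a prototype column of X, and
    responsibilities are computed per feature, not per example.
    """
    F = X.T                                   # features as rows: (d, n)
    # deterministic farthest-point initialization of K prototypes
    idx = [0]
    for _ in range(K - 1):
        dmin = ((F[:, None, :] - F[idx][None]) ** 2).sum(-1).min(axis=1)
        idx.append(int(dmin.argmax()))
    centers = F[idx].astype(float).copy()
    for _ in range(n_iter):
        dist = ((F[:, None, :] - centers[None]) ** 2).sum(-1)  # (d, K)
        logits = -beta * dist                 # clusters compete per feature
        logits -= logits.max(axis=1, keepdims=True)
        R = np.exp(logits)
        R /= R.sum(axis=1, keepdims=True)     # responsibilities over clusters
        # prototypes become responsibility-weighted feature averages
        centers = (R.T @ F) / np.maximum(R.sum(axis=0)[:, None], 1e-12)
    return R, centers
```

On data whose features fall into highly correlated groups, the responsibilities `R` recover the grouping; this is soft k-means transposed, which is the simplest reading of "competing for features while cooperating on examples".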


Factorial Learning and the EM Algorithm

Ghahramani, Zoubin

Neural Information Processing Systems

Many real world learning problems are best characterized by an interaction of multiple independent causes or factors. Discovering such causal structure from the data is the focus of this paper. Based on Zemel and Hinton's cooperative vector quantizer (CVQ) architecture, an unsupervised learning algorithm is derived from the Expectation-Maximization (EM) framework. Due to the combinatorial nature of the data generation process, the exact E-step is computationally intractable. Two alternative methods for computing the E-step are proposed: Gibbs sampling and mean-field approximation, and some promising empirical results are presented.
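A minimal sketch of a mean-field E-step for a CVQ-style model, in which each of J factors selects one of K codebook vectors and the data vector is their noisy sum. The parameterization below (Gaussian noise with variance `sigma2`, coordinate-ascent updates to the factorized posterior) is an illustrative assumption, not the paper's exact derivation.

```python
import numpy as np

def mean_field_estep(x, W, sigma2=0.1, n_iter=20):
    """Mean-field E-step for a CVQ-style generative model (a sketch).

    Assumed model: x = sum_j W[j, s_j] + Gaussian noise, where each
    factor j selects one of K codebook rows of W[j].
    W has shape (n_factors, K, d); returns m with m[j, k] ~ q(s_j = k).
    """
    J, K, _ = W.shape
    m = np.full((J, K), 1.0 / K)      # start from uniform beliefs
    for _ in range(n_iter):
        for j in range(J):
            # expected reconstruction from all *other* factors under q
            others = sum(m[i] @ W[i] for i in range(J) if i != j)
            resid = x - others
            # coordinate-ascent update for factor j's responsibilities
            logits = (W[j] @ resid - 0.5 * (W[j] ** 2).sum(axis=1)) / sigma2
            logits -= logits.max()
            m[j] = np.exp(logits)
            m[j] /= m[j].sum()
    return m
```

The intractable exact E-step would sum over all K^J joint configurations; the fixed-point iteration above visits each factor in turn and costs only O(J·K·d) per sweep, which is the computational point the abstract is making.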


On a Modification to the Mean Field EM Algorithm in Factorial Learning

Neural Information Processing Systems

A modification is described to the use of mean field approximations in the E step of EM algorithms for analysing data from latent structure models, as described by Ghahramani (1995), among others. The modification involves second-order Taylor approximations to expectations computed in the E step. The potential benefits of the method are illustrated using very simple latent profile models.
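The kind of correction described can be written generically: for a latent variable \(s\) with mean \(m\) under the variational distribution, a second-order Taylor expansion of a smooth function \(f\) about \(m\) gives

```latex
\mathbb{E}[f(s)] \;\approx\; f(m) + \tfrac{1}{2}\, f''(m)\,\operatorname{Var}(s),
\qquad \operatorname{Var}(s) = m(1-m) \ \text{for binary } s \in \{0,1\},
```

whereas the plain mean-field E step keeps only the zeroth-order term \(\mathbb{E}[f(s)] \approx f(m)\) (the first-order term vanishes because \(\mathbb{E}[s-m]=0\)). This is the generic delta-method form; the paper's exact expansion may differ in its details.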


Learning About Multiple Objects in Images: Factorial Learning without Factorial Search

Williams, Christopher K. I., Titsias, Michalis K.

Neural Information Processing Systems

We consider data which are images containing views of multiple objects. Our task is to learn about each of the objects present in the images. This task can be approached as a factorial learning problem, where each image must be explained by instantiating a model for each of the objects present with the correct instantiation parameters. A major problem with learning a factorial model is that as the number of objects increases, there is a combinatorial explosion of the number of configurations that need to be considered. We develop a method to extract object models sequentially from the data by making use of a robust statistical method, thus avoiding the combinatorial explosion, and present results showing successful extraction of objects from real images.
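A toy sketch of the sequential, robust-statistics idea: fit a robust per-pixel appearance model to the image stack (pixels generated by the other, not-yet-modelled objects act as outliers and are down-weighted), mask out the well-explained pixels, and repeat. The fixed object positions, the Geman-McClure-style weight, and the threshold `c` are illustrative assumptions; the paper's actual models handle instantiation parameters such as object position.

```python
import numpy as np

def extract_objects_sequentially(images, n_objects, n_iter=30, c=1.0):
    """Extract object appearance models one at a time (a sketch).

    images: array of shape (n_images, n_pixels). Each pass fits a
    robust per-pixel mean; pixels belonging to other objects are
    treated as outliers, so no joint search over object
    configurations is ever performed.
    """
    X = images.astype(float).copy()
    active = np.ones(X.shape, dtype=bool)    # pixels still unexplained
    models = []
    for _ in range(n_objects):
        # initialize from the plain mean over still-active pixels
        counts = active.sum(axis=0)
        mu = np.where(active, X, 0.0).sum(axis=0) / np.maximum(counts, 1)
        for _ in range(n_iter):
            r = X - mu                             # residuals
            w = (1.0 / (1.0 + (r / c) ** 2)) * active  # robust weights
            mu = (w * X).sum(axis=0) / np.maximum(w.sum(axis=0), 1e-12)
        models.append(mu)
        # mask out pixels the new model already explains well
        active &= np.abs(X - mu) > c
    return models
```

Each pass is linear in the number of objects, which is the sense in which the sequential scheme avoids the combinatorial search over joint object configurations.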

